Revisiting Negation in Neural Machine Translation

Authors

Abstract

In this paper, we evaluate the translation of negation both automatically and manually, in English–German (EN–DE) and English–Chinese (EN–ZH). We show that the ability of neural machine translation (NMT) models to translate negation has improved with deeper and more advanced networks, although the performance varies between language pairs and translation directions. The accuracy of the manual evaluation in EN→DE, DE→EN, EN→ZH, and ZH→EN is 95.7%, 94.8%, 93.4%, and 91.7%, respectively. In addition, we show that under-translation is the most significant error type in NMT, which contrasts with the more diverse error profile previously observed for statistical machine translation. To better understand the root cause of the under-translation of negation, we study the model's information flow and training data. While our analysis does not reveal any deficiencies that could be used to detect or fix this error, we find that negation is often rephrased during training, which could make it difficult for the model to learn a reliable link between source and target negation. We finally conduct intrinsic and extrinsic probing tasks on negation, showing that NMT models can distinguish negation and non-negation tokens very well and encode a lot of information about negation in hidden states, but nevertheless leave room for improvement.
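To make the probing setup mentioned in the abstract more concrete, the sketch below trains a simple linear probe to separate negation from non-negation tokens using their encoder hidden states. This is an illustration only, not the authors' code: the hidden states and labels are synthetic placeholders, whereas in the paper they would come from a trained NMT model and negation-annotated data.

```python
# Minimal sketch of an intrinsic probing task: predict whether a token is a
# negation cue from its (frozen) NMT encoder hidden state.
# The data below is synthetic; in practice the states would be extracted from
# a trained NMT model and the labels from negation-annotated sentences.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder encoder states: one 512-dimensional vector per token.
hidden_states = rng.normal(size=(2000, 512))
# Placeholder labels: 1 = negation cue (e.g. "not", "kein"), 0 = other token.
is_negation = rng.integers(0, 2, size=2000)

X_train, X_test, y_train, y_test = train_test_split(
    hidden_states, is_negation, test_size=0.2, random_state=0
)

probe = LogisticRegression(max_iter=1000)
probe.fit(X_train, y_train)
print(f"probing accuracy: {probe.score(X_test, y_test):.3f}")
```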


Similar Articles

Neural Name Translation Improves Neural Machine Translation

In order to control computational complexity, neural machine translation (NMT) systems convert all rare words outside the vocabulary into a single unk symbol. A previous solution (Luong et al., 2015) resorts to using multiple numbered unks to learn the correspondence between source and target rare words. However, test words unseen in the training corpus cannot be handled by this method. And it a...
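As a rough illustration of the numbered-unk scheme described above (not the original implementation), the following sketch replaces out-of-vocabulary source words with indexed placeholders and restores them in post-processing; the toy vocabulary and helper names are invented for the example.

```python
# Illustrative sketch of the numbered-unk scheme: OOV source words become
# indexed placeholders (unk1, unk2, ...), and the mapping is kept so that
# matching placeholders in the model's output can be copied back afterwards.

def replace_oov(tokens, vocab):
    """Replace OOV tokens with numbered unks; return new tokens and mapping."""
    mapping, out, next_id = {}, [], 1
    for tok in tokens:
        if tok in vocab:
            out.append(tok)
        else:
            placeholder = f"unk{next_id}"
            mapping[placeholder] = tok
            out.append(placeholder)
            next_id += 1
    return out, mapping

def restore_oov(tokens, mapping):
    """Post-process model output: copy the original word for each unkN."""
    return [mapping.get(tok, tok) for tok in tokens]

vocab = {"the", "president", "visited", "."}
src = ["the", "president", "visited", "Reykjavik", "."]
masked, mapping = replace_oov(src, vocab)
print(masked)                          # ['the', 'president', 'visited', 'unk1', '.']
print(restore_oov(["unk1"], mapping))  # ['Reykjavik']
```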


Revisiting Pivot Language Approach for Machine Translation

This paper revisits the pivot language approach for machine translation. First, we investigate three different methods for pivot translation. Then we employ a hybrid method combining RBMT and SMT systems to fill up the data gap for pivot translation, where the source–pivot and pivot–target corpora are independent. Experimental results on spoken language translation show that this hybrid method s...
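The basic pivot setup described here can be sketched as two chained translation systems; the functions below are hypothetical stand-ins for independently trained source–pivot and pivot–target models and are not the paper's hybrid RBMT/SMT method.

```python
# Sentence-level pivoting: translate source -> pivot with one system,
# then pivot -> target with a second, independently trained system.

def translate_src_to_pivot(sentence: str) -> str:
    # Placeholder for a source->pivot system (e.g. ZH->EN).
    return sentence  # identity stand-in, for illustration only

def translate_pivot_to_tgt(sentence: str) -> str:
    # Placeholder for a pivot->target system (e.g. EN->DE).
    return sentence  # identity stand-in, for illustration only

def pivot_translate(sentence: str) -> str:
    """Chain the two systems through the pivot language."""
    return translate_pivot_to_tgt(translate_src_to_pivot(sentence))

print(pivot_translate("example source sentence"))
```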


Neural Machine Translation

Draft of a textbook chapter on neural machine translation. A comprehensive treatment of the topic, ranging from an introduction to neural networks, computation graphs, a description of the currently dominant attentional sequence-to-sequence model, recent refinements, alternative architectures, and challenges. Written as a chapter for the textbook Statistical Machine Translation. Used in the JHU Fall 2017...


Unsupervised Neural Machine Translation

In spite of the recent success of neural machine translation (NMT) in standard benchmarks, the lack of large parallel corpora poses a major practical problem for many language pairs. There have been several proposals to alleviate this issue with, for instance, triangulation and semi-supervised learning techniques, but they still require a strong cross-lingual signal. In this work, we completely...


Variational Neural Machine Translation

Models of neural machine translation are often from a discriminative family of encoder-decoders that learn a conditional distribution of a target sentence given a source sentence. In this paper, we propose a variational model to learn this conditional distribution for neural machine translation: a variational encoder-decoder model that can be trained end-to-end. Different from the vanilla encod...



Journal

Journal title: Transactions of the Association for Computational Linguistics

Year: 2021

ISSN: 2307-387X

DOI: https://doi.org/10.1162/tacl_a_00395